A Globally Convergent Modified Conjugate-gradient Line-search Algorithm with Inertia Controlling
Authors
Abstract
In this paper we address the problem of unboundedness of the search direction when the Hessian is indefinite or nearly singular. We propose a new algorithm that naturally handles singular Hessian matrices and is theoretically equivalent to the trust-region approach. This is accomplished by adaptively performing explicit matrix modifications that mimic the implicit modifications used by trust-region methods. Further, we provide a new variant of the modified conjugate gradient algorithm that implements this strategy in a robust and efficient way. Numerical results demonstrate the effectiveness of this approach in the context of a line-search method for large-scale unconstrained nonconvex optimization.
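The abstract describes explicit, adaptive matrix modifications that mimic the implicit shift a trust-region method applies. As an illustrative sketch only (not the authors' algorithm), the fragment below runs conjugate gradients on H d = -g and, whenever negative or near-zero curvature is detected, restarts on the shifted system (H + σI) d = -g with a growing shift; the function name, initial shift, and tolerances are assumptions.

```python
import numpy as np

def modified_cg(H, g, shift=1e-2, tol=1e-8, max_iter=100):
    """Solve H d = -g by conjugate gradients; on negative or near-zero
    curvature, restart on (H + sigma*I) d = -g with an increased sigma
    until the iteration behaves as if the matrix were positive definite."""
    n = len(g)
    sigma = 0.0
    while True:
        d = np.zeros(n)
        r = -g.copy()              # residual of (H + sigma I) d = -g at d = 0
        p = r.copy()
        ok = True
        for _ in range(max_iter):
            Hp = H @ p + sigma * p
            curv = p @ Hp
            if curv <= tol * (p @ p):   # indefinite / near-singular: modify
                ok = False
                break
            alpha = (r @ r) / curv
            d += alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < tol:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        if ok:
            return d, sigma
        sigma = max(2.0 * sigma, shift)  # grow the modification and retry
```

On an indefinite Hessian the returned direction is a descent direction for the shifted model, which is the behavior the explicit-modification strategy is meant to guarantee.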
Similar papers
Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property
Using the search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribière-Polyak methods are proposed that satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are conducted on a set of CUTEr unconstrained opti...
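The sufficient descent property mentioned above can be illustrated with a simpler, standard safeguard than the three-term directions of that paper: the PRP+ rule truncates negative β values and falls back to steepest descent whenever descent is lost. This is a generic textbook sketch, not the cited method; the restart tolerance is an assumption.

```python
import numpy as np

def prp_plus_direction(g_new, g_old, d_old):
    """One PRP+ conjugate-gradient update:
    beta = max(0, g_new^T (g_new - g_old) / ||g_old||^2),
    with a steepest-descent restart if d fails to be a descent direction."""
    beta = max(0.0, g_new @ (g_new - g_old) / (g_old @ g_old))
    d = -g_new + beta * d_old
    if g_new @ d > -1e-10 * (g_new @ g_new):  # descent lost: restart
        d = -g_new
    return d
```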
Global Convergence of a Modified Liu-Storey Conjugate Gradient Method
In this paper, we make a modification to the LS conjugate gradient method and propose a descent LS method. The method generates a sufficient descent direction for the objective function. We prove that the method is globally convergent with an Armijo-type line search. Moreover, under mild conditions, we show that the method is globally convergent if the Armijo line search or the Wolfe line sea...
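An Armijo-type line search of the kind referred to here can be sketched as the standard backtracking loop below (a generic textbook version, not that paper's specific rule; the constants c and rho are conventional defaults):

```python
import numpy as np

def armijo(f, grad, x, d, c=1e-4, rho=0.5, alpha=1.0, max_backtracks=50):
    """Backtracking Armijo line search: shrink alpha until the sufficient
    decrease condition f(x + alpha d) <= f(x) + c * alpha * grad(x)^T d holds."""
    fx = f(x)
    slope = grad(x) @ d
    assert slope < 0, "d must be a descent direction"
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho
    return alpha
```

For example, on f(x) = x^T x from x = (1, 1) along d = -∇f(x), the unit step overshoots and one halving is accepted.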
Two new conjugate gradient methods based on modified secant equations
Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...
Modification of the Wolfe Line Search Rules to Satisfy the Descent Condition in the Polak-Ribière-Polyak Conjugate Gradient Method
This paper proposes a line search technique that satisfies a relaxed form of the strong Wolfe conditions in order to guarantee the descent condition at each iteration of the Polak-Ribière-Polyak conjugate gradient algorithm. It is proved that this line search preserves the usual convergence properties of any descent algorithm. In particular, it is shown that the Zoutendijk condition holds...
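The strong Wolfe conditions that this line search relaxes can be checked at a trial step as follows. This is the standard textbook definition, not the paper's relaxed variant; the constants c1 and c2 are illustrative choices.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions at step length alpha:
    (i) sufficient decrease: f(x + alpha d) <= f(x) + c1 alpha grad(x)^T d,
    (ii) curvature: |grad(x + alpha d)^T d| <= c2 |grad(x)^T d|."""
    slope0 = grad(x) @ d
    armijo_ok = f(x + alpha * d) <= f(x) + c1 * alpha * slope0
    curvature_ok = abs(grad(x + alpha * d) @ d) <= c2 * abs(slope0)
    return bool(armijo_ok and curvature_ok)
```

On f(x) = x^T x from x = (1, 1) along d = -∇f(x), the exact minimizer alpha = 0.5 satisfies both conditions, while alpha = 1.0 fails the sufficient-decrease test.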
A Modified Conjugate Gradient Method for Unconstrained Optimization
Conjugate gradient methods are an important class of methods for solving unconstrained optimization problems, especially large-scale problems, and have recently been studied in depth. In this paper, we further study the conjugate gradient method for unconstrained optimization, focusing our attention on descent conjugate gradient methods. This paper presents a modified conjugate gradien...